Consistency of minimum divergence estimators based on grouped data
Similar resources
Consistency of orthogonal series density estimators based on grouped observations
The aim of this note is to indicate that nonparametric orthogonal series estimators of probability densities retain the mean integrated square error (MISE) consistency when observations are grouped to the points of a uniform grid (prebinned). This kind of grouping is typical for computer rounding errors and may also be useful in data compression, before calculating estimates, e.g., using the FF...
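As an illustrative sketch only (not the estimator analyzed in that paper), prebinning observations to a uniform grid and then forming a cosine-series density estimate on [0, 1] could look as follows; the grid size, the number of basis terms, and the Beta-distributed sample are assumptions made for the example.

import numpy as np

def prebin(x, m):
    # Group observations to the centers of m equal-width bins on [0, 1].
    edges = np.linspace(0.0, 1.0, m + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    counts, _ = np.histogram(x, bins=edges)
    return centers, counts

def cosine_series_density(centers, counts, n_terms=8):
    # Orthogonal (cosine) series estimate on [0, 1] built from binned data:
    # each coefficient is the sample mean of a basis function, evaluated at
    # the bin centers and weighted by bin counts (the prebinning approximation).
    n = counts.sum()
    coefs = [1.0]  # phi_0 = 1, so the leading coefficient of a density is 1
    for j in range(1, n_terms + 1):
        phi_j = np.sqrt(2.0) * np.cos(np.pi * j * centers)
        coefs.append(np.sum(counts * phi_j) / n)
    def fhat(t):
        t = np.asarray(t, dtype=float)
        val = np.full_like(t, coefs[0])
        for j in range(1, n_terms + 1):
            val += coefs[j] * np.sqrt(2.0) * np.cos(np.pi * j * t)
        return val
    return fhat

# Example: the estimate is computed from the grouped (rounded) data only.
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=2000)      # hypothetical sample on [0, 1]
centers, counts = prebin(x, m=64)      # grouping to a uniform grid
fhat = cosine_series_density(centers, counts)
print(fhat(np.array([0.1, 0.3, 0.7])))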
Minimum Distance Estimators for Nonparametric Models with Grouped Dependent Variables
This Version: January 2002. This paper develops minimum distance estimators for nonparametric models where the dependent variable is known only to fall in a specified group with observable thresholds, while its true value remains unobserved and possibly censored. Such data arise commonly in major U.S. and U.K. data sets where, e.g., the thresholds between which earnings fall are observed, but not ...
Robust Tests Based on Minimum Density Power Divergence Estimators and Saddlepoint Approximations
The nonrobustness of classical tests for parametric models is a well-known problem, and various robust alternatives have been proposed in the literature. Usually, the robust tests are based on first-order asymptotic theory, and their accuracy in small samples is often an open question. In this paper we propose tests which have both robustness properties and good accuracy in small samples. The...
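For reference, minimum density power divergence estimators are usually built on the divergence of Basu, Harris, Hjort and Jones (1998); a standard form, written here without that paper's specific conventions, is

\[
d_{\alpha}(g, f) = \int \Bigl\{ f^{1+\alpha}(x) - \bigl(1 + \tfrac{1}{\alpha}\bigr)\, g(x)\, f^{\alpha}(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \Bigr\}\, dx, \qquad \alpha > 0,
\]

where g is the true density and f the model density; letting \alpha \to 0 recovers the Kullback–Leibler divergence.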
Goodness-of-fit Tests and Minimum Power Divergence Estimators for Survival Data
Power-divergence statistics are proposed for grouped survival data. They are analogous to the power-divergence family of statistics proposed and studied in detail by Read and Cressie (1988) and Cressie and Read (1984) for contingency tables. The proposed statistics are useful for testing the validity of parametric model assumptions in analyses of survival data. It is shown that these statistics hav...
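For orientation, the Cressie–Read power-divergence statistic in its familiar contingency-table form (the grouped-survival version in that paper may differ in details) is, for observed counts O_i, expected counts E_i, and index \lambda,

\[
\mathrm{PD}_{\lambda} = \frac{2}{\lambda(\lambda + 1)} \sum_{i} O_i \left[ \left( \frac{O_i}{E_i} \right)^{\lambda} - 1 \right],
\]

with \lambda = 1 giving Pearson's chi-square statistic and \lambda \to 0 the likelihood-ratio statistic.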
Weighted Sampling, Maximum Likelihood and Minimum Divergence Estimators
This paper explores maximum likelihood estimation in parametric models in the context of Sanov-type large deviation probabilities. The MLE in parametric models under weighted sampling is shown to be associated with the minimization of a specific divergence criterion defined with respect to the distribution of the weights. Some properties of the resulting inferential procedure are presented; Bahadur efficiency...
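As a minimal illustration of the link (the unweighted, discrete case; the weighted-sampling criterion studied in the paper is more specific), maximizing the likelihood over a parametric family \{p_{\theta}\} is the same as minimizing the Kullback–Leibler divergence from the empirical distribution P_n:

\[
\hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} \sum_{i=1}^{n} \log p_{\theta}(X_i) = \arg\min_{\theta} \mathrm{KL}\bigl(P_n \,\Vert\, p_{\theta}\bigr),
\]

since \mathrm{KL}(P_n \Vert p_{\theta}) = \sum_x P_n(x) \log P_n(x) - \frac{1}{n} \sum_{i=1}^{n} \log p_{\theta}(X_i), and the first term does not depend on \theta.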
Journal
Journal title: Statistics & Probability Letters
Year: 2007
ISSN: 0167-7152
DOI: 10.1016/j.spl.2006.11.021